Conversation
@jackspiering I have made the requested changes, please have a look.

Thank you for the edit. However, the comment in the compose.yaml has not been edited; this could generate an error. Please comment that line out, my comment is still open. Besides that, could you please add Ollama to the general README.md table? Thank you!

Please have a look; I have edited the comment in the compose.yaml and added Ollama to the general README table.
- Add time zone setting for containers in .env
- Improve formatting of the configuration table in README

Checked the requested changes; they are addressed.

Pull Request Title: Add Ollama Service
Description
Adds a Tailscale sidecar configuration for Ollama, a tool for running large language models (LLMs) locally. This lets users access their local models securely from any device on their Tailnet — including phones and remote machines — without exposing the Ollama API to the public internet.
Includes:
- `compose.yaml` following the ScaleTail sidecar pattern (`network_mode: service:tailscale`, health checks, `depends_on` with health condition)
- `.env` template with `SERVICE`, `IMAGE_URL`, `TS_AUTHKEY`, and an optional `OLLAMA_API_KEY`
- `README.md` covering prerequisites, volumes, MagicDNS/HTTPS setup, optional LAN port exposure, and first-run model pull instructions
- `yourNetwork` external network on the Tailscale container, enabling other containers (e.g. Open WebUI) to reach Ollama via inter-container networking

Related Issues
N/A
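For readers unfamiliar with the ScaleTail sidecar pattern named above, a minimal sketch of what the `compose.yaml` might look like (image tags, volume paths, and the health check command here are illustrative assumptions, not the exact file in this PR):

```yaml
services:
  tailscale:
    image: tailscale/tailscale:latest   # assumed tag; the PR may pin a version
    hostname: ollama                    # becomes the MagicDNS name on the Tailnet
    environment:
      - TS_AUTHKEY=${TS_AUTHKEY}
      - TS_STATE_DIR=/var/lib/tailscale
    volumes:
      - ./ts/state:/var/lib/tailscale
    healthcheck:
      test: ["CMD", "tailscale", "status"]
      interval: 10s
      retries: 5
    networks:
      - yourNetwork                     # external network for inter-container access

  ollama:
    image: ollama/ollama:latest
    network_mode: service:tailscale     # share the sidecar's network namespace
    volumes:
      - ./ollama-data:/root/.ollama
    depends_on:
      tailscale:
        condition: service_healthy      # wait for the sidecar's health check

networks:
  yourNetwork:
    external: true
```

Because `network_mode: service:tailscale` shares the sidecar's network namespace, Ollama's port 11434 is reachable at the Tailscale container's addresses without any `ports:` mapping.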
Type of Change
How Has This Been Tested?
- Ran `docker compose config` — no errors or missing variable warnings.
- Created the required directories (`config`, `ts/state`, `ollama-data`) and started the stack with `docker compose up -d`.
- Verified `healthy` status via `docker compose ps`.
- Confirmed Tailnet connectivity with `docker exec tailscale-ollama tailscale status`.
- Pulled `tinyllama` and sent a test generation request via `curl` to the Tailnet IP on port 11434 — received a valid JSON response.
- Repeated the `curl` test from a second device on the same Tailnet to confirm remote access works end-to-end.
- Confirmed `tailscale-ollama` appears in `docker network inspect yourNetwork`.

Checklist
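As a concrete sketch of the model-pull and test-request steps (the container name `ollama` and the Tailnet IP `100.x.y.z` are placeholders; substitute your own values):

```shell
# Pull a small model inside the Ollama container
# ("ollama" is an assumed service name; adjust to match your compose.yaml)
docker exec -it ollama ollama pull tinyllama

# Send a test generation request to the Tailnet IP on Ollama's default
# port 11434; stream=false returns a single JSON object instead of a stream
curl http://100.x.y.z:11434/api/generate \
  -d '{"model": "tinyllama", "prompt": "Say hello in one sentence.", "stream": false}'
```

A valid JSON response from the second command confirms the API is reachable over the Tailnet.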
- Documentation updated (`README.md`)

Screenshots (if applicable)
N/A — no visual UI changes.
Additional Notes
- The `yourNetwork` network is optional. Users who don't need inter-container communication can remove the `networks:` sections from `compose.yaml` entirely.
- `OLLAMA_KEEP_ALIVE` is set to `24h` by default to keep models warm; users can adjust or remove this to suit their hardware.
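Putting the variables mentioned above together, the `.env` template might look roughly like this (all values shown are illustrative assumptions; the actual template in this PR may differ):

```
# .env template — values are examples, replace with your own
SERVICE=ollama
IMAGE_URL=ollama/ollama:latest
TS_AUTHKEY=tskey-auth-xxxxxxxx   # generate in the Tailscale admin console
OLLAMA_API_KEY=                  # optional
OLLAMA_KEEP_ALIVE=24h            # keeps models loaded; lower to free memory
TZ=Etc/UTC                       # time zone setting for the containers
```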